Search results for: Divergence measures

Number of results: 399991

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, … and so on. Properties and results related to distance between probability d...
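For reference, the standard definitions of the measures named in this abstract, written for two probability densities p and q on a common support (a sketch of the usual conventions; the paper's exact normalizations may differ):

D_{KL}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx

J(p, q) = D_{KL}(p \,\|\, q) + D_{KL}(q \,\|\, p)

H(p, q) = \left( \tfrac{1}{2} \int \bigl( \sqrt{p(x)} - \sqrt{q(x)} \bigr)^{2}\,dx \right)^{1/2}

Here D_{KL} is the Kullback-Leibler information, J is Jeffreys' symmetrized version of it, and H is the Hellinger distance.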

Journal: Applied Mathematics & Information Sciences, 2013

2005
INDER JEET TANEJA

Abstract. The arithmetic, geometric and harmonic means are the three classical means well known in the literature. Another mean, the square-root mean, is also known. In this paper, we have constructed divergence measures based on nonnegative differences among these means, and established an interesting inequality by using properties of Csiszár's f-divergence. An improvement over this inequality is a...
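The construction described here rests on Csiszár's f-divergence; a minimal sketch of the standard definition, together with one well-known mean-difference divergence as illustration (the specific measures and the inequality in the paper may differ):

C_f(p, q) = \int q(x)\, f\!\left( \frac{p(x)}{q(x)} \right) dx, \qquad f \text{ convex on } (0,\infty),\ f(1) = 0.

For positive a, b, the arithmetic mean A(a,b) = (a+b)/2 dominates the geometric mean G(a,b) = \sqrt{ab}, and integrating the nonnegative difference gives

\Delta_{AG}(p, q) = \int \left( \frac{p(x)+q(x)}{2} - \sqrt{p(x)\,q(x)} \right) dx = \frac{1}{2} \int \bigl( \sqrt{p(x)} - \sqrt{q(x)} \bigr)^{2}\, dx,

i.e. the squared Hellinger distance up to normalization; it is the f-divergence generated by f(t) = (1+t)/2 - \sqrt{t}.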

Journal: Mathematical Research Letters, 2002

Journal: International Journal of Approximate Reasoning, 2005

X. X. He, Y. F. Li

Srivastava and Maheshwari (Iranian Journal of Fuzzy Systems 13(1) (2016) 25-44) introduced a new divergence measure for intuitionistic fuzzy sets (IFSs). The properties of the proposed divergence measure were studied, and the efficiency of the proposed divergence measure in the context of medical diagnosis was also demonstrated. In this note, we point out some errors in ...

1999
E. Torres, Pedro Miranda, Pedro Gil

2005
INDER JEET TANEJA

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information, Jeffreys' [16] J-divergence, and the information radius or Jensen difference divergence measure due to Sibson [23]. Burbea and Rao [3, 4] have also found applications of it in the literature. Taneja [25] studied anoth...
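The information radius (Jensen difference) mentioned in this abstract can be written, in the two-distribution case with equal weights (a sketch; Sibson's general definition allows arbitrary weights and more than two distributions), as

R(p, q) = \frac{1}{2}\, D_{KL}\!\left( p \,\Big\|\, \frac{p+q}{2} \right) + \frac{1}{2}\, D_{KL}\!\left( q \,\Big\|\, \frac{p+q}{2} \right),

which is also known as the Jensen-Shannon divergence; unlike D_{KL} itself, it is symmetric and always finite.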

[Chart: number of search results per year]